From: route@monster.com

Sent: Tuesday, June 04, 2013 3:54 PM

To: hg@apeironinc.com

Subject: Please review this candidate for: Big Data

 

This resume has been forwarded to you at the request of Monster User xapeix01

Jaymin Patel 

Last updated:  04/02/12

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Charlotte, NC  28277
US

Mobile:   
Home:

patel_jaymin@yahoo.com


 

 

RESUME

  

Resume Headline: Jaymin Patel


  

 

JAYMIN PATEL

 

980-297-4855                                                            patel_jaymin@yahoo.com

 

Summary: 15 years of experience in

·         Architecting and implementing Business Intelligence (BI)/Analytics systems, Enterprise Data Integrations/Warehouses (DW), Data marts, Hubs and Operational Data Stores.

·         Conversant with Big Data architecture and processing framework (Hadoop File Systems, MapReduce, Pig, Hive, Oozie etc.)

·         In-depth, hands-on knowledge of Data/Information Architecture, Metadata Management, Data Governance, data warehousing methodologies, design of complex dimensional schemas, enterprise-wide data warehouses and data marts.

·         Successfully played the roles of Data Architect, Technical Project Manager, Team/Delivery Lead and Lead Developer to deliver full life-cycle implementations of BI Applications, an Enterprise Data Warehouse in closed-loop hub-spoke architecture, ODS and BI Analytics

·         Right blend of technical, functional and people skills required to craft information management processes and solutions from vision to implementation

·         Performed comparative analysis and unbiased recommendations in ETL and BI tools evaluation and criteria development

·         Strong technical background combined with very hands-on approach helped deliver on-time and effective solutions

Major Clients/Industries served:

Technology

Google (Advertisement platforms) , Informatica Corporation (analytics)

Financial Services/Banking

Largest Banking & Financial client (Risk Analysis)

Wachovia Bank (Investment Banking)

TIAA-CREF, Accredited Home Lenders (Mortgage)

Housing Finance and State Finance agencies (India)

Utilities

Duke Energy

Media/Entertainment

Walt Disney, Time-Warner Cable, Google

Retail

Gateway Computers, National Dairy Development Board (India)

Manufacturing/Auto

Nissan Motors, Toyota Financial Services, GE Plastics

 

Technical Skills:

Data Modelling

ERwin 3.x/4.x/r7 with Model Mart, Embarcadero ER-Studio, Sybase PowerDesigner 15.x

ETL

Informatica PowerCenter  8.x/7.1.x/6.x/5.x/4.7 

Talend Integration Studio (4.1/4.2)

KETL (open-source Kinetics ETL), SSIS/DTS (SQL Server)

BI /Reporting

Cognos ReportNet 1.x, Cognos PowerPlay 7, Infa PowerAnalyzer 4.x, Business Objects 4.x/5.0, SAP Business Warehouse 2.1 (classroom trained), MicroStrategy 7.0 (classroom trained), QlikView, Pentaho BI suite

Programming

Python, Perl, Shell, SQL, PL/SQL, C, XML

Database

Netezza 4.6/3.x, Oracle 11g/10g/9i/8.x/7.x, SQL Server 2000/2005, DB2 7.x, MySQL 5.0

Others

WinCVS, VSS, STAT, CA-Harvest, Perforce version control

AutoSys, Redwood, Tidal schedulers

MS Project, NIKU Workbench, Visio productivity software

                   

Publications:

Article/White Paper "Seven simple rules for successful Real-time Business Intelligence implementation"
Published on DMReview.com (May 2005 edition)
Link: http://www.information-management.com/infodirect/20050506/1027144-1.html

Education:

                            MBA with specialization in Finance, Gujarat University, India, 1997

                            BS in Electronics & Communications, Gujarat University, India, 1994

 

Certifications/Training:

                            Cloudera Certified Hadoop Developer

                            Informatica Certified Consultant (Admin and Designer, Expert level in PowerCenter 1/4/5.x)

                            Netezza Advanced Concepts class by IBM

                            SpotFire Reporting user training by Tibco

                             Project Management (PMP) Training by local PMP Chapter

                                                 

Project Summary:

 

Client: Leading Bank, Charlotte, NC                                                                                Mar 11 - Present

Data Architect/Data Integration Architect/Lead Developer for Quantitative Risk Technology group

Environment: Netezza Performance Server (NPS 4.6), Talend Integration Studio 4.x, Tibco SpotFire, Moody’s RiskFrontier Analytics platform, Hadoop cluster, Hive, Oozie, Sqoop

 

As a Big Data Architect/Developer

·         Designed and developed Netezza Multi-Migrate utility to replicate data across NZ databases/hosts

·         Designed storage and archival mechanism for large back-ups of Netezza on to Hadoop File Systems (HDFS) 

·         Established Directory structure for HDFS storage & Hive metastore

·         Developed ETL process and standards using Sqoop, Hive and Oozie to benchmark high-volume data extracts from Teradata into Netezza

·         Designed Hive-Metastore to be used as staging/landing area for big data. Created Hive-QL scripts to perform simple transformations within HDFS before transporting to Netezza Data Warehouse

·         Working on integration of Autosys and Oozie to introduce cross-platform dependencies for HDFS loads

 

As a Data Architect

·         Collaborated with Consumer and Commercial line-of-business users to refine data warehouse requirements

·         Designed Netezza Databases for Staging and Enterprise Data Mart focusing on various consumer and commercial data sources to enable Risk Profiling and Analysis of portfolios

·         Played instrumental role in designing database for self-service Web Application which enabled Quantitative Analysts to create Dynamic Segmentation rules and execute various aggregation on terabytes of data

·         Compared and Evaluated Data Quality/Profiling tools (DataFlux and Talend Data Profiler) for QA Analysts

 

As a Data Integration Architect/Lead developer

·         Designed Integration Layer using Talend, Shell Scripts and Netezza SQLs to extract terabytes of data

·         Developed and reviewed Netezza SQL scripts to ensure they adhere to Best Practices

·         Designed near-real-time ETL processes to support Dynamic Segmentation and Profiling

·         Developed Integration methods to Send/Receive portfolio data to/from Moody’s RiskFrontier Analytics system

 

Client: Duke Energy, Charlotte, NC                                                                                 Aug 09 - Mar 11

ETL Architect within Integration Center of Excellence (ICC) Group

Environment: Informatica 8.1/8.6, SQL Server 2005, Oracle 10g

·         Guide and provide ETL architecture best practices and technical expertise to various project teams

·         Planned and Coordinated Informatica upgrades and resolved technical issues like codepage compatibility etc.

Data Integration Architect for Enterprise Asset Management, Workforce HUB

·         Worked with Data architects and functional users to design system controls and auditability

·         Reviewed logical and physical models from an ETL and auditability perspective; reviewed ETL specifications and ETL code developed by onsite/offshore developers

·         Designed and Developed complex/real-time ETLs to interface with IBM MQ messaging systems

Environment: Informatica 8.6/8.1, SQL Server 2005/2008, ESP scheduler, IBM MQ

 

Client: Wachovia Bank, Charlotte, NC                                                                               Apr 08 - July 09

Data Integration Architect for FACS-Datawarehouse

Environment: Informatica 8.1/8.6, SQL Server 2005, SSRS (Reporting Studio), Unix, Autosys

·         Collaborated with Business partners and Application Owners to hash out data sourcing and transmission requirements for various trading systems.

·         Contributed to designing automated reconciliation process for transactions and positions between trading system sources (like Calypso, IBIS, OPICS) and FACS data warehouse

·         Laid the foundation and best practices for the Financial Accounting and Control (FACS) Datawarehouse ETL layer

·         Re-designed file-based Sources integration to provide audit statistics and traceability for every piece of data in each pipeline e.g. file arrival time, byte size, record-counts and audit column sum at each stage

·         Authored migration guidelines and Informatica/UNIX specific development guideline and naming standards including reusable components across various systems, dynamic parameter files

·         Mentor on-site and off-shore development teams and provide Informatica development Best Practices

 

Client: Google, Mountain View, CA                                                                                  Apr 07 - Mar 08

Data Architect/Team lead for HR Data Mart ( Apr-Oct 07)

·         Led team of 3-7 members to deliver various HR subject area BI reporting through Agile methodologies.

·         Created and maintained Data Model repositories for HR-related Conformed Dimensions and HR Data Warehouse/subject area marts in the Erwin/All-Fusion Model Mart Repository

·         Architected Conformed Dimensions for Google’s Internal Data Mart subject areas and contributed to developing a common platform (bus architecture) for Dimensions like Employee, Organization, Country/Currencies, Sales Org etc.

·         Performed Data Security/Sensitivity analysis and classified attributes into Sensitive (SI), Personal Information (PI) and Personally Identifiable Information (PII)

·         Facilitated User-Acceptance Testing for MicroStrategy BI reports/ Dashboards along with end-to-end data analysis for reported bugs and resolutions

·         Worked with different upstream and downstream consumers/publishers to refine Conformed Dimensions

·         Designed and deployed ETL code-base for various subject areas like Applicant Tracking pipeline, Google-HR operational reporting, Performance Appraisal and Conformed dimensions (related to HR).

·         Created XML/Web-service based integration layer between HR source systems and HR-DW for source data dumps and integration protocols

·         Co-architected ETL layer for all subject areas using XML/SOAP architecture and Oracle SQL tools.

·         Transitioned the support functions to IBM Global Services team with Knowledge transfer sessions

Data Integration/ETL Consultant/Analyst for Ads-Datawarehouse /CRM Data mart (Oct 07 – Apr 08)

·         Designed, developed and managed ETL architecture written in Python/Shell/XML to bring Google’s Advertisement transactions into the DW, facilitating metrics like clicks, impressions and cost/$ to be reported across a variety of dimensions like regions, platforms/web-properties, time and industry verticals.

·         Built a few key fact tables in the Ads Data Warehouse (16 TB) on Netezza appliances.

·         Led an effort to collate external demographic data feeds from D&B, Thomson etc. for gap analysis between Google’s existing customers, prospects and respective media spends.

·         Worked as Business analyst/ETL consultant for CRM-Data Mart focusing on various metrics across Google Sales/CRM team hierarchy, Campaigns and their performance.

·         Managed international users’ (EMEA) requirements through virtual meetings and strawman exercises before putting together POC and ETL design.

Environment: Netezza 3.0.5/4.0, Python, XML/SOAP, MicroStrategy 7.x/8.1, Oracle 10g, KETL (open-source Kinetics ETL), XML, Unix

 

Client: TIAA-CREF, Charlotte                                                                                       Nov 06 - Apr 07

Data Architect, ETL Architect   for   HR Data Mart

TIAA-CREF is a $380 billion financial services group that has focused on serving the academic, medical, cultural, and research fields for more than 85 years. The HR Data Mart initiative aims to create a central repository for all employee information including benefits, compensation, staffing, workforce metrics and performance management.

·         Reviewed prevailing DW architecture and presented recommendations for possible areas of improvement in data mart design, ETL design and overall business processes for data mart implementation

·         Worked with Legal and Compliance groups to ensure adherence to Corporate Data Security standards and other compliances. Prepared Strategy document/guidelines for FTE and Contractor data separation strategies and presented that with proposed approach to various groups

·         Worked with HR application implementation teams to identify data attributes important for the HR Data Mart and created a Data Dictionary and Element Glossary for business users

·         Established ETL development best practices (Informatica), ETL design specifications  and assisted developers in ETL development

·         Worked closely with database and Informatica administrators to manage ETL migration processes with adequate SDLC documentation

Environment: ERwin Data Modeller, Informatica PowerCenter 7.1, Oracle 10g, Unix, Actuate

 

Client: Time Warner Cable, Charlotte                                                                               Nov 05 - Oct 06

DW-ETL Architect/Team Lead   for  Video-On-Demand (VOD) Data Mart                                                                                                        

Time Warner Cable (TWC) is one of the largest cable and media service providers in the country. Video-on-demand services are offered to customers by all regions and divisions of TWC. The VOD Data Mart project involved sourcing regional VOD application data (from 33 or more divisions) and provided metrics like number of sessions and time spent on each commercial or movie across various dimensions like asset (title), category and genre. The VOD Mart provided ad-hoc as well as analytical capabilities to business users for performance measurement, and it provided vital information for better planning and focusing of marketing efforts.

Responsibilities

·         Led and supported business requirements sessions with various Marketing group users (Analysts to Directors)

·         Designed Dimensional Model and ETL architecture (involving an encryption mechanism for sensitive data) for retrieving information from VOD applications

·         Prepared and presented time/resource estimates for sub-phases and other potential Data Mart initiatives.

·         Assisted team technically for complex ETL program development and performance tuning of data loads

·         Facilitated weekly status calls with various user-groups as well as IT management

·         Co-ordinated efforts and transition with various support groups for systems implementation procedures

·         Evaluated new projects/reporting requests and provided scope/approach recommendation along with effort estimations and time-line projections

Environment: Oracle 9i/10g, Informatica PowerCenter 7.1, Cognos ReportNet 1.0, Cognos PowerPlay 7

 

Client: Disney, Buena Vista TV, CA                                                                                 Oct 04 - Nov 05

Data Architect/ETL architect at Zeus Operational Data Store (ODS)                                                                             

Buena Vista Television (BVTV) is a unit of The Walt Disney Group engaged in media distribution across the world. The ZEUS application was being developed to enable Contract Management and respective Accounts Receivable management for BVTV. For faster and better reporting without affecting the ZEUS application, the ODS project aimed to build an Operational Data Store sourcing data mainly from the upcoming ZEUS application along with other forecasting systems. The ODS will serve as a close-to-real-time source for all operational reporting requirements.

·         Analyzed contract management operational and executive reporting requirements and ZEUS source system structures/data flows

·         Designed logical and physical models for ODS. Worked closely with Cognos Architects to understand Cognos Framework and tuned ODS design accordingly. Conducted ODS Model overview presentations and review sessions with various support/business groups and ZEUS project teams

·         Involved in ETL tool selection and evaluated Informatica, Embarcadero DT Studio along with DB2 propagator (for real-time data integration)

·         Designed ETL architecture, established specifications and standards/best-practices for Informatica programs

·         Managed team of 4 developers. Established and maintained ETL Task assignments, Status reports. Monitored and communicated the progress and quality of deliverables

·         Prepared detailed integration test plan for end-to-end data load testing.

 

Environment: Informatica PowerCenter 7.1, Cognos ReportNet 1.0, IBM DB2 7.2/8, Embarcadero ER-Studio

Sources : Custom-built Contract Management System, Cognos Planning Application, Excel spreadsheets.
 

Client: Accredited Home Lenders, San Diego                                                                         June 03 - Oct 04

Production Data Mart (Near Real Time ) at Accredited Home Lenders

DW-ETL Architect/Technical Lead                                                                                                       June 03-July 04

Accredited Home Lenders (AHL) is a premier mortgage lending company focused on the non-prime market. This project focused on capturing and providing near real-time information about loan origination and the entire life cycle of loan processing. It enabled AHL management to monitor and analyze the efficiency of their loan processing system and the productivity of their employees. The Production Data Mart dashboards and reports are used by over 500 users at various levels of AHL management all over the US on a real-time basis.

Responsibilities:             

·         Understood Loan-origination business processes, analysed Data Mart reporting requirements and source systems through a series of JAD sessions and user interviews

·         Designed Data Mart architecture and implemented real-time data warehouse sourcing from various Loan origination software packages to meet 15-minute data mart refresh intervals.

·         Developed complex ETL mappings and provided technical guidance to other resources

·         Partnered with client BI manager and analysts in order to facilitate BI-Dashboards and other critical reports requirements and design.

·         Managed development team of 4 developers and BI Architect. Organized development team status meetings and co-ordinated progress communication to Project Manager and other Client managers

·         Contributed to analysis, design and testing documentation deliverables at end of each phase/deliverables.

Environment: Informatica PowerCenter 6.2, PowerAnalyzer 4.1, SQL Server 8.0, Windows NT

Sources: EMPOWER package (from Fidelity), OutlookSoft package (planning & budgeting), legacy OLTPs.
 

Loan Servicing Data Mart at Accredited Home Lenders

Requirements Lead/Data Architect                                                                                                           July 04-Oct 04

Building on the huge success of the Production Data Mart, this project focused on capturing and providing information related to loan servicing and the subsequent sale of loans in capital markets. The broader objective of the project was to leverage the conformed dimensions built in the Production Data Mart and provide AHL management one version of the truth and the ability to analyze information from a cross-functional perspective.

Responsibilities:             

·         Carried out interviews and overview sessions with multiple business functional groups including Cash-Flow Management, Loan Servicing, Legal/Compliance, Delinquency/Collections, Foreclosure, Finance and Capital Markets to gain business process understanding.

·         Analyzed Data mart and reporting requirements and source systems through series of user interviews

·         Prepared the Discovery Phase document as well as a Roadmap document to provide recommendations about business process enhancements, recommended architecture and approach for the Servicing Data Mart

·         Prepared high-level logical model and integration requirements for conformed dimensions (from the Production Data Mart)

 

Client: Nissan, CA                                                                                                 Dec 02 - Jun 03

Regional Data Warehouse at Nissan , USA

ETL Architect/Lead Developer                                                                                                                    

The objective of this Data warehouse was to consolidate financial data (General Ledger) of all the North American subsidiaries of Nissan Motor Sales. Source systems included SAP FI and other customized accounting systems. The final data warehouse presented the data consolidated by various critical dimensions like Vehicle models, geography etc.
Responsibilities:             

·         Designed and implemented ETL architecture for this DW, which was based on financial data (SAP ERP) from all subsidiaries of Nissan, North America.

·         Performed analysis of consolidation processes of financial results for Nissan subsidiaries in order to build a similar automated ETL process.

·         Automated file-transfers, file-validations and file-cleansing processes by using UNIX scripts. Performance tuned historical and monthly data loads.

Environment: Informatica PowerCenter 6.1, UNIX, DB2 7.0, Hyperion

Sources: SAP FI, CO modules. Custom built Accounting Packages for smaller subsidiaries

Client: Toyota Financial Services, CA                                                                              Dec 01 - Dec 02

Insurance Data Warehouse at Toyota Financial Services

ETL Architect/Lead Developer, System Tester                                                                                        

Toyota Financial Services provides Insurance and Service Agreements/Warranties products to Toyota automobile buyers. The Insurance Data Warehouse gathered all the insurance and Service agreements/Warranties information as well as Claims information from various legacy mainframe transaction systems and third party databases like POLK. The claims and warranties facts were organized by critical dimensions like Customer, Dealer, Geographies, Make, Model and Year of vehicles etc.

Responsibilities:             

·         Analysed, designed and developed ETL processes for Insurance Data Mart and its components

·         Converted legacy COBOL code into Informatica mappings to improve performance and ease maintenance.

·         Designed Data reconciliation and Audit processes and specification documents

·         Developed Informatica Batch Control and scheduling module with help of UNIX, PL/SQL and Informatica mappings. Performance tuned bottleneck ETL processes with design and SQL improvements.

·         Actively participated in System Testing of data mart by preparing system test cycles design document, generating test data and execution of system test plans.

·         Worked on Requirement Analysis and design of major enhancement for the same data-mart and created Data-model impact analysis, ETL impact analysis and design specifications.

·         Developed with client resources an offshore outsourcing plan to migrate maintenance and enhancements to an offshore model.

Environment: Informatica PowerCenter 5.1, MicroStrategy, Oracle 8.1.7, UNIX, Redwood, ERwin

Sources: Mainframe/DB2 legacy system, POLK Vehicle data, MS Access, Flat file extracts

 

Client: Informatica Corporation, CA                                                                                Jan 01 - Nov 01

Informatica Applications development at  Informatica Corporation

ETL Architect/Developer , QA Analyst                                                                                                      

Informatica is a market leader in Data Integration and Business Intelligence software. This project was carried out by its Application Business Unit to develop pre-packaged Analytics product suites. In a nutshell, these products minimize the development time for a typical data warehouse project through ready-to-plug business adapters (for various leading CRM and ERP software) and pre-packaged canned reports ready to deploy. Thus, they form an end-to-end Business Intelligence solution ready to use after minimal customization.

Responsibilities:             

·         Designed and developed extraction logic for IADP (Informatica Analytical Delivery Platform) products. The system delivers a web-based enterprise-wide analytic solution built on a dimensional data model. It can source data using pre-built extractors from ERP systems and custom applications on any ODBC-compliant database. Enterprise data can be analyzed using pre-built reporting metrics covering areas like Business Operations Analytics (BOA), Supply Chain Analytics (SCA), Customer Relationship Analytics (CRA) and Web Channel Analytics (WCA).

·         Performed Data validations, Unit testing, integration testing, business scenario data generation and testing and Performance Tuning of Customer Relationship Analytics Product suite.

·         Automated QA procedures/non-compliance reports with the help of Informatica Metadata Queries, PL/SQL and Unix Shell Scripts. These programs eliminated many manual QA tasks for repository testing.

·         Performed QA activities including unit testing and Universe/repository testing for Pre-packaged BusinessObjects Reports.

Environment :              Informatica PowerCenter 5.1, Oracle 8.1.7, BusinessObjects, UNIX, Windows NT

Client: Gateway Computers, Denver                                                                                  Apr 00 - Jan 01

CKM/CM (Customer Knowledge Management/Campaign Management) data warehouse

ETL Developer, QA Engineer                                                                                                                     

Customer Knowledge Management (CKM) data warehouse focused on all current and prospective customers of Gateway Computers. It sourced from different transaction systems and third-party customer repositories like Dun & Bradstreet and Axiom, and involved data-cleansing routines using FirstLogic tools. The CM (Campaign Management) Mart focused on information required to analyze various campaigns, their effectiveness and ROI.

Responsibilities:             

·         Analysed, designed, developed and implemented ETL processes (using Informatica mappings, database procedures and UNIX scripts) for the CKM/CM data warehouse/marts and ODS, which capture existing and prospective customers, campaign and order details.

·         Designed and developed load control and audit repository using Informatica Metadata, PL/SQL and Unix scripts

·         Prepared design specifications and scripts for views and summary tables required to utilize the MicroStrategy tool for various subject areas.

Environment: Informatica PowerCenter 4.7, MicroStrategy 6, Oracle 8.1.5, UNIX, Valex (Campaign)

Sources: JDE OneWorld, Vantive (CRM), Dun & Bradstreet feeds, Customer Profiling Data (Axiom)

Client: National Dairy Development Board, India                                                                    Feb 00 - Apr 00

Web-based Monitoring System for NDDB (National Dairy Development Board), India.

Requirements Analysis Team Lead/Data Modeller                                                                                   

This Monitoring System aimed to provide a web-based solution for the companies’ remote dairy units to facilitate entry of accurate summary data, and to enable corporate planning units to analyze Key Performance Indicators using various drill-down and trend analysis capabilities.

Responsibilities:             

·         Performed Requirements Gathering and analysis for the web-based Performance Monitoring system to monitor the growth of all milk-collecting unions spread over the country. The system’s scope also included MIS reports with OLAP query tools for top management.

·         Designed OLTP data model for the system and Denormalized schema for summary tables.

·         Prepared a presentation of OLAP tools and Techniques (BusinessObjects) for the top management of the client.

Environment: BusinessObjects 4.1.4 (Web Intelligence), Oracle 8.0.5, HTML, Javascript, COM

Client: GE Plastics, MA                                                                                            Apr 99 - Jan 00

GEMMS Data Warehouse project at GE plastics, USA.

Business Analyst/Front-end & ETL developer                                                                                    

GEMMS Data Warehouse focused on costing analysis of the different plastic manufacturing plants of GE Plastics. Before this project there were six identical data warehouses, one for each plant. The project involved consolidation and enhancement of various functionalities for better analytical capabilities.

Responsibilities:             

·         Gathered functional requirements and designed specifications for BusinessObjects reports

·         Created and maintained BusinessObjects metadata layer (Universes)

·         Developed BusinessObjects Reports and Administered various user-groups for different Universes

·         Designed and developed ETL programs using UNIX, PL/SQL and SQL*Loader utilities.

·         Tuned PL/SQL programs and Database tables (by adding partitioning, bitmap indexes and materialized views) for enhanced Query performance.

Environment: BusinessObjects 4.1.4, Oracle 8.0.5/7.3, UNIX, PL/SQL, SQL*Loader, Autosys, ConnectDirect

Client: Housing Finance Company, India                                                                             Jul 98 - Apr 99

Development of Application System for GRUH (Gujarat Rural Housing Finance Ltd.), India.

Module Leader, Financial Accounting System Module                                                             

The project was about developing an enterprise-wide application system for the housing finance company including all functional areas like Housing Finance, Fixed Deposits, Financial Accounting System, Budgeting, HRD etc.

Responsibilities:             

·         Performed System Design with focus on financial accounting sub-module

·         Developed and tested Application Software using Forms and Reports Designer

·         Contributed to Effort Estimation for future phases and enhancements

·         Carried out Quality Assurance and Documentation

·         Executed Module testing of FAS (Financial Accounting System) module. This role involved the testing of components (Forms and Reports) for the functional and business flow related validations.

Environment: Designer 2000, Developer 2000 (Forms 4.5, Reports 2.5), Oracle 7.3, Windows NT 4.0

Client: State Finance Corporation, India                                                                           Apr 97 - Jul 98

Development of  Management Information System For GSFC (Gujarat State Financial Corp)

Analyst/Developer, Financial Accounting Module                                                                                   

GSFC is a state-owned corporation engaged in providing financial assistance to small and mid-level industries. This project involved designing and implementing an integrated application covering all core activities of GSFC like Appraisal, Disbursement and Recovery of loans, Personnel, Financial Accounting, Corporate Planning, Legal, Inspection and Administration.

Responsibilities:             

·         Performed Requirements Analysis (utilizing CASE tool Designer 2000) with focus on Accounting subsystem.

·         Prepared Software Requirements Specification (SRS) document (for Legal and Inspection Module)

·         Prepared Design Specifications  (High Level Design and Low Level Design) for FAS module

·         Designed and Developed forms and Reports for FAS module using Developer/Designer 2000.

Environment: Designer 2000, Developer 2000 (forms 4.5, Reports 2.5), Oracle 7.3, Windows NT 4.0.

 




Additional Info


 

Desired Salary/Wage:

100.00 - 130.00 USD/hr

Current Career Level:

Manager (Manager/Supervisor of Staff)

Date of Availability:

Within 2 weeks

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

Citizenship:

Permanent resident

 

 

Target Job:

Target Job Title:

Big Data/DW Architect

Desired Job Type:

Temporary/Contract/Project

Desired Status:

Full-Time
Part-Time
Per Diem

 

Target Company:

Company Size:

Industry:

Energy and Utilities
Retail
Banking
Insurance
Management Consulting Services
Computer Software
Healthcare Services
Computer/IT Services
Financial Services

Occupation:

IT/Software Development

·         Database Development/Administration

·         Enterprise Software Implementation & Consulting

·         Software/Web Development

·         Systems Analysis - IT

 

Target Locations:

Selected Locations:

US
US-NC-Charlotte

Relocate:

No

Willingness to travel:

Up to 75% travel